Rademacher Chaos Complexities for Learning the Kernel Problem
Authors
Abstract
Similar resources
Rademacher Chaos Complexities for Learning the Kernel Problem
We develop a novel generalization bound for the learning-the-kernel problem. First, we show that the generalization analysis of the kernel learning problem reduces to investigating the suprema of the Rademacher chaos process of order 2 over candidate kernels, which we refer to as the Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity by well-establis...
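The quantity described above can be illustrated with a small Monte Carlo sketch: for a sample of size n and a finite set of candidate Gram matrices, it averages, over random Rademacher sign vectors, the supremum over kernels of the (normalized) order-2 chaos sum over pairs i < j. The function name, kernel widths, and normalization below are illustrative choices, not the paper's exact definition.

```python
import numpy as np

def rademacher_chaos_complexity(kernel_mats, n_draws=200, seed=0):
    """Monte Carlo sketch of an empirical Rademacher chaos complexity:
    average over sign draws of sup_K |(1/n) * sum_{i<j} eps_i eps_j K(x_i, x_j)|.
    `kernel_mats` is a list of n-by-n candidate Gram matrices."""
    rng = np.random.default_rng(seed)
    n = kernel_mats[0].shape[0]
    iu = np.triu_indices(n, k=1)              # strictly upper-triangular pairs i < j
    total = 0.0
    for _ in range(n_draws):
        eps = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        outer = np.outer(eps, eps)
        # homogeneous chaos of order two: off-diagonal (i < j) terms only
        total += max(abs((outer[iu] * K[iu]).sum()) / n for K in kernel_mats)
    return total / n_draws

# toy data and Gaussian candidate kernels of a few different widths
X = np.random.default_rng(1).normal(size=(30, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
kernels = [np.exp(-sq / (2 * s ** 2)) for s in (0.5, 1.0, 2.0)]
print(rademacher_chaos_complexity(kernels))
```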
Bounds for Learning the Kernel: Rademacher Chaos Complexity
In this paper we develop a novel probabilistic generalization bound for regularized kernel learning algorithms. First, we show that the generalization analysis of kernel learning algorithms reduces to investigating the suprema of a homogeneous Rademacher chaos process of order two over candidate kernels, which we refer to as the Rademacher chaos complexity. Our new methodology is based on the princ...
Generalization Bounds for Learning the Kernel: Rademacher Chaos Complexity
One of the central issues in kernel methods [5] is the problem of kernel selection (learning). This problem has recently received considerable attention, ranging from selecting the width parameter of Gaussian kernels to obtaining an optimal linear combination from a finite set of candidate kernels; see [3, 4]. In the latter case, the kernel learning problem is often termed multi-kernel learn...
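The "optimal linear combination" setting above can be made concrete with a short sketch: given a finite set of candidate Gram matrices, multi-kernel learning searches over convex combinations K(μ) = Σ_k μ_k K_k with μ on the simplex. The widths and weights below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)       # squared distances
# finite set of candidate Gaussian kernels of different widths
candidates = [np.exp(-sq / (2 * s ** 2)) for s in (0.5, 1.0, 2.0)]

mu = np.array([0.2, 0.5, 0.3])                            # example simplex weights
K = sum(m * Km for m, Km in zip(mu, candidates))

# a convex combination of positive semidefinite kernels is again a valid kernel
assert np.linalg.eigvalsh(K).min() > -1e-10
```

In practice the weights μ are not fixed by hand but optimized jointly with the classifier, which is exactly what makes kernel learning a harder problem than training with a single fixed kernel.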
Transductive Rademacher Complexities for Learning Over a Graph
Recent investigations [12, 2, 8, 5, 6] and [11, 9] indicate the use of a probabilistic ('learning') perspective on tasks defined on a single graph, as opposed to the traditional algorithmic ('computational') point of view. This note discusses the use of Rademacher complexities in this setting and illustrates the use of Kruskal's algorithm for transductive inference based on a nearest neighbo...
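Kruskal's algorithm, mentioned in the snippet above, can be sketched in a few lines: sort the edges by weight and greedily add each edge that joins two different components, tracked with a union-find structure. This is a generic minimum-spanning-tree sketch, not the note's specific transductive procedure.

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm on a graph with nodes 0..n-1.
    `edges` is a list of (weight, u, v) tuples; returns MST edges (u, v, weight)."""
    parent = list(range(n))

    def find(x):
        # find the component root, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # process edges in order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
print(kruskal_mst(4, edges))  # -> [(0, 1, 1), (1, 2, 2), (2, 3, 4)]
```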
Local Rademacher Complexities
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification, ...
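The idea of computing Rademacher averages on a data-dependent subset of functions with small empirical error can be illustrated for a finite class with a short Monte Carlo sketch. The function names, class construction, and the quantile used for localization are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def empirical_rademacher(preds, n_draws=500, seed=0):
    """Monte Carlo estimate of E_eps sup_f (1/n) sum_i eps_i f(x_i)
    for a finite class; row k of `preds` holds function k's values on the sample."""
    rng = np.random.default_rng(seed)
    n = preds.shape[1]
    total = 0.0
    for _ in range(n_draws):
        eps = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += (preds @ eps).max() / n        # sup over the finite class
    return total / n_draws

rng = np.random.default_rng(1)
y = rng.choice([-1.0, 1.0], size=50)            # toy labels
preds = np.sign(rng.normal(size=(40, 50)))      # 40 random {-1,+1} classifiers

# localization: keep only the functions with small empirical error
err = (preds != y).mean(axis=1)
local = preds[err <= np.quantile(err, 0.25)]

print(empirical_rademacher(preds), empirical_rademacher(local))
```

Because the localized class is a subset of the full class, its empirical Rademacher average can only be smaller, which is the mechanism behind the faster rates.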
Journal
Journal title: Neural Computation
Year: 2010
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco_a_00028